309 research outputs found

    Exact Penalty Functions for Mathematical Programs with Linear Complementarity Constraints

    We establish a new general exact penalty function result for a constrained optimization problem and apply this result to a mathematical program with linear complementarity constraints.
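    To make the setting concrete, the sketch below writes down a generic MPLCC and one simple penalty for the complementarity condition; this is a hedged illustration of the problem class, not the specific penalty function or exactness result established in the paper.

```latex
% Illustrative only: a generic MPLCC and a simple penalty for the
% complementarity condition; the paper's penalty and assumptions may differ.
\begin{align*}
\text{(MPLCC)}\qquad
  & \min_{x}\; f(x)
    \quad\text{s.t.}\quad 0 \le x \;\perp\; Mx + q \ge 0, \\[4pt]
\text{(penalized)}\qquad
  & \min_{x \ge 0,\; Mx + q \ge 0}\; f(x) + \sigma\, x^{\top}(Mx + q).
\end{align*}
% On the relaxed feasible set, x^T (Mx + q) >= 0 and vanishes exactly when
% complementarity holds, so it measures the violation of that constraint.
% An exact penalty result then asserts that, under suitable assumptions,
% there is a finite sigma_0 such that for every sigma >= sigma_0 the
% penalized problem has the same (local) solutions as the original MPLCC.
```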

    Meta-stable memory in an artificial immune network

    This paper describes an artificial immune system algorithm which implements a fairly close analogue of the memory mechanism proposed by Jerne (1) (usually known as the Immune Network Theory). The algorithm demonstrates the ability of this type of network to produce meta-stable structures representing populated regions of the antigen space. The networks produced retain their structure indefinitely and capture inherent structure within the sets of antigens used to train them. Results from running the algorithm on a variety of data sets are presented and shown to be stable over long time periods and wide ranges of parameters. The potential of the algorithm as a tool for multivariate data analysis is also explored.
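    As a rough indication of how this family of algorithms operates (clonal expansion toward antigens plus suppression of mutually similar network cells), the Python sketch below uses hypothetical names and parameters; it is not the paper's algorithm, only a minimal illustration of the clone/mutate/suppress loop that lets separated, stable cell populations form.

```python
import numpy as np

rng = np.random.default_rng(0)

def affinity(cell, antigen):
    """Affinity modelled as negative Euclidean distance (higher = closer)."""
    return -np.linalg.norm(cell - antigen)

def immune_network_step(cells, antigens, clone_rate=3, mutation=0.1,
                        suppression_threshold=0.5):
    """One illustrative iteration: each antigen stimulates its best-matching
    cell, which is cloned and mutated; cells too similar to an already-kept
    cell are then suppressed. All parameters here are hypothetical."""
    new_cells = list(cells)
    for ag in antigens:
        best = max(cells, key=lambda c: affinity(c, ag))
        for _ in range(clone_rate):
            new_cells.append(best + mutation * rng.standard_normal(best.shape))
    # Network suppression: removing near-duplicate cells is what allows
    # stable, well-separated populations of memory cells to persist.
    kept = []
    for c in new_cells:
        if all(np.linalg.norm(c - k) > suppression_threshold for k in kept):
            kept.append(c)
    return kept

# Toy usage: antigens drawn from two well-separated regions of a 2-D space.
antigens = np.vstack([rng.normal(0.0, 0.2, (20, 2)),
                      rng.normal(3.0, 0.2, (20, 2))])
cells = [rng.standard_normal(2) for _ in range(5)]
for _ in range(50):
    cells = immune_network_step(cells, antigens)
print(f"{len(cells)} memory cells remain, covering both antigen regions")
```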

    Robust linear and support vector regression


    Clustering problems in optimization models

    We discuss a variety of clustering problems arising in combinatorial applications and in classifying objects into homogeneous groups. For each problem we discuss solution strategies that work well in practice. We also discuss the importance of careful modelling in clustering problems.
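    As a minimal illustration of clustering posed as an optimization model, the sketch below runs plain Lloyd's iterations for the k-means objective (minimizing within-cluster squared distances); the function and parameters are assumptions for illustration, not a method taken from the paper.

```python
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    """Plain Lloyd's iterations for the k-means objective: alternately assign
    each point to its nearest centre, then move each centre to the mean of
    its assigned points. Illustrative only."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        # Assignment step: index of the nearest centre for every point.
        dists = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: mean of each non-empty cluster.
        for j in range(k):
            if np.any(labels == j):
                centres[j] = X[labels == j].mean(axis=0)
    return labels, centres

# Toy usage: two well-separated groups of points in the plane.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(4.0, 0.3, (50, 2))])
labels, centres = kmeans(X, k=2)
print(centres)  # one centre near (0, 0), the other near (4, 4)
```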

    Templates for Convex Cone Problems with Applications to Sparse Signal Recovery

    This paper develops a general framework for solving a variety of convex cone problems that frequently arise in signal processing, machine learning, statistics, and other fields. The approach works as follows: first, determine a conic formulation of the problem; second, determine its dual; third, apply smoothing; and fourth, solve using an optimal first-order method. A merit of this approach is its flexibility: for example, all compressed sensing problems can be solved via this approach. These include models with objective functionals such as the total-variation norm, ||Wx||_1 where W is arbitrary, or a combination thereof. In addition, the paper introduces a number of technical contributions such as a novel continuation scheme, a novel approach for controlling the step size, and some new results showing that the smoothed and unsmoothed problems are sometimes formally equivalent. Combined with our framework, these lead to novel, stable and computationally efficient algorithms. For instance, our general implementation is competitive with state-of-the-art methods for solving intensively studied problems such as the LASSO. Further, numerical experiments show that one can solve the Dantzig selector problem, for which no efficient large-scale solvers exist, in a few hundred iterations. Finally, the paper is accompanied by a software release. This software is not a single, monolithic solver; rather, it is a suite of programs and routines designed to serve as building blocks for constructing complete algorithms.
    Comment: The TFOCS software is available at http://tfocs.stanford.edu. This version has updated references.
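    The four-step recipe can be summarized symbolically. The LaTeX sketch below is a hedged paraphrase of the standard conic template (generic objective f, linear map A, cone K), not a quotation of the paper's formulas.

```latex
% Hedged paraphrase of the standard conic template; the paper's exact
% notation, smoothing function and algorithmic details may differ.
\begin{align*}
\text{(1) conic form:}\quad
  & \min_{x}\ f(x) \quad\text{s.t.}\quad \mathcal{A}(x) + b \in \mathcal{K},\\
\text{(2) dual:}\quad
  & \max_{\lambda \in \mathcal{K}^{*}}\ g(\lambda),\qquad
    g(\lambda) = \inf_{x}\ \big( f(x) - \langle \lambda,\ \mathcal{A}(x) + b \rangle \big),\\
\text{(3) smoothing:}\quad
  & f_{\mu}(x) = f(x) + \tfrac{\mu}{2}\,\lVert x - x_{0} \rVert_{2}^{2},
    \quad\text{making the dual objective smooth with Lipschitz gradient},\\
\text{(4) solve:}\quad
  & \text{maximize the smoothed dual with an optimal (accelerated) first-order}\\
  & \text{method, driving } \mu \downarrow 0 \text{ by continuation.}
\end{align*}
```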